Software Development
Google BERT Best Practices and Examples
AI Practitioner: BERT Best Practices & Design Considerations
AI Practitioner: Practical BERT Examples

AI Practitioner: BERT Best Practices & Design Considerations

Course Number:
it_aibtbpdj_01_enus
Lesson Objectives

AI Practitioner: BERT Best Practices & Design Considerations

  • discover the key concepts covered in this course
  • recall traditional natural language processing techniques and approaches
  • describe the limitations of traditional natural language processing techniques and list potential breakthroughs
  • define the terms "attention" and "transformer" as they relate to natural language processing
  • specify the role of natural language processing techniques like BERT
  • describe how utilizing BERT techniques helps improve search quality
  • outline how BERT techniques facilitate context specificity
  • list ways of using BERT techniques for search engine optimization
  • describe how masking is used in BERT
  • demonstrate how to do data augmentation using masking and BERT in Python (a minimal sketch follows this list)
  • illustrate how to do text tokenization using BERT in Python
  • show how to do text encoding using BERT in Python (see the tokenization and encoding sketch after this list)
  • define a BERT model in Python and create and compile the BERT layer using TensorFlow
  • train a BERT model in Python and identify the various hyperparameters for BERT
  • demonstrate how to do data prediction using BERT in Python: load a trained BERT model, create the sample data, and predict using the model (an end-to-end sketch follows the course overview below)
  • describe how the use of the TensorFlow package can advance BERT techniques
  • summarize the key concepts covered in this course
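
To make the masking and data-augmentation objectives concrete, here is a minimal sketch. It assumes the Hugging Face transformers package and the bert-base-uncased checkpoint, neither of which is prescribed by the course itself: BERT's masked-language-model head proposes replacements for a masked token, and each proposal yields an augmented variant of the sentence.

```python
# Hypothetical sketch, not the course's own notebook: data augmentation via
# BERT's masked-language-model head, using Hugging Face `transformers` and
# the `bert-base-uncased` checkpoint (both assumptions).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "The service at this restaurant was [MASK]."
for candidate in fill_mask(sentence, top_k=3):
    # Each candidate fills [MASK] with a plausible token, giving an
    # augmented variant of the original training sentence.
    print(f"{candidate['sequence']}  (score={candidate['score']:.3f})")
```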
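
Similarly, a hedged sketch of tokenization and encoding, again assuming Hugging Face transformers (the course may instead build its tokenizer from TensorFlow Hub):

```python
# Illustrative tokenization and encoding with a BERT WordPiece tokenizer;
# the `bert-base-uncased` checkpoint is an assumption.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

text = "BERT reads text in both directions."

# Tokenization: raw text -> WordPiece sub-word tokens.
tokens = tokenizer.tokenize(text)
print(tokens)

# Encoding: tokens -> integer IDs with the special [CLS]/[SEP] markers
# and an attention mask, ready to feed into a BERT model.
encoded = tokenizer(text, padding="max_length", max_length=16, truncation=True)
print(encoded["input_ids"])
print(encoded["attention_mask"])
```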

Overview/Description

Bidirectional Encoder Representations from Transformers (BERT) is a natural language processing technique that takes the capabilities of AI language systems to new heights: Google's BERT achieves state-of-the-art performance on several complex natural language understanding tasks. In this course, you'll examine the fundamentals of traditional NLP and distinguish them from more advanced techniques like BERT. You'll define the terms "attention" and "transformer" and explore how they relate to NLP. You'll then examine a series of real-life applications of BERT, such as improving search quality, search engine optimization, and masked language modeling. Next, you'll work with an NLP pipeline that utilizes BERT in Python for several tasks, namely text tokenization and encoding, model definition and training, and data augmentation and prediction. Finally, you'll recognize the benefits of using BERT and TensorFlow together.
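
As a companion to the pipeline this overview describes, here is a compact, hypothetical end-to-end sketch: define a classification head on a BERT layer, compile and train it in TensorFlow, and predict on fresh sample data. The two-example dataset, checkpoint name, and hyperparameters are illustrative assumptions, not course materials.

```python
# Hypothetical end-to-end sketch: define, compile, train, and predict with a
# BERT-based classifier in TensorFlow via Hugging Face `transformers`.
import tensorflow as tf
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = TFBertModel.from_pretrained("bert-base-uncased")

# A toy two-example dataset (purely illustrative).
texts = ["BERT reads text in both directions.",
         "Older models read strictly left to right."]
labels = tf.constant([1.0, 0.0])
enc = tokenizer(texts, padding=True, truncation=True, return_tensors="tf")

# Model definition: a sigmoid classification head on BERT's pooled output.
input_ids = tf.keras.Input(shape=(None,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.Input(shape=(None,), dtype=tf.int32, name="attention_mask")
pooled = bert(input_ids, attention_mask=attention_mask).pooler_output
prob = tf.keras.layers.Dense(1, activation="sigmoid")(pooled)
model = tf.keras.Model([input_ids, attention_mask], prob)

# Compile and train; the learning rate, batch size, and epoch count are the
# kinds of hyperparameters the course objectives refer to.
model.compile(optimizer=tf.keras.optimizers.Adam(2e-5), loss="binary_crossentropy")
model.fit([enc["input_ids"], enc["attention_mask"]], labels, epochs=1, batch_size=2)

# Prediction on fresh sample data with the trained model.
sample = tokenizer(["BERT looks at context on both sides of a word."],
                   padding=True, return_tensors="tf")
print(model.predict([sample["input_ids"], sample["attention_mask"]]))
```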



Target

Prerequisites: none

AI Practitioner: Practical BERT Examples

Course Number:
it_aibtbpdj_02_enus
Lesson Objectives

AI Practitioner: Practical BERT Examples

  • discover the key concepts covered in this course
  • name practical approaches to improving search using BERT
  • describe how BERT functions inside a search engine
  • demonstrate how BERT can be used to search the text of a given document
  • describe how we can use BERT for next sentence prediction
  • use BERT and Python for next sentence prediction via a PyTorch implementation of BERT (sketched after this list)
  • outline how BERT can be used for sequence classification
  • work with BERT to implement a sequence classifier (see the sketch after this list)
  • describe how multiple-choice reading comprehension can be done using BERT
  • use BERT and Python to implement multiple-choice examples via a PyTorch implementation of BERT (see the sketch after this list)
  • outline how to utilize BERT for token classification
  • work with BERT to implement a token classifier (see the sketch after this list)
  • describe how to develop a question-answering machine using BERT
  • work with BERT to implement a question-answering machine (a sketch follows the course overview below)
  • outline some fundamental guidelines for content optimization using BERT
  • summarize the key concepts covered in this course
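
As promised in the objectives above, here is a minimal next-sentence-prediction sketch using the PyTorch implementation of BERT from Hugging Face transformers (an assumed dependency; the course's own examples may differ in detail).

```python
# Illustrative next sentence prediction; `bert-base-uncased` is an assumption.
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

first = "The storm knocked out power across the city."
second = "Crews worked overnight to restore electricity."

inputs = tokenizer(first, second, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Logit index 0 means "the second sentence follows the first";
# index 1 means it does not.
probs = torch.softmax(logits, dim=1)
print(f"P(second follows first) = {probs[0, 0].item():.3f}")
```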
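
Next, a sequence-classification sketch. Note that the two-label head below is freshly initialized, so its outputs are meaningless until the model is fine-tuned; the point is only to show the moving parts.

```python
# Sequence classification: a BERT encoder with an (untrained) two-label head.
# Checkpoint name and label count are illustrative assumptions.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased",
                                                      num_labels=2)

inputs = tokenizer("This course made BERT finally click for me.",
                   return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# With a fine-tuned checkpoint, argmax over the logits gives the class label.
print(logits, logits.argmax(dim=1))
```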
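
For multiple-choice reading comprehension, BertForMultipleChoice encodes each (prompt, choice) pair separately and scores the candidates jointly. As above, bert-base-uncased is an assumption, and the untuned head produces arbitrary scores until fine-tuned.

```python
# Multiple-choice sketch: the model picks among candidate continuations.
import torch
from transformers import BertTokenizer, BertForMultipleChoice

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMultipleChoice.from_pretrained("bert-base-uncased")

prompt = "Water freezes when it"
choices = ["gets cold enough.", "is poured into a glass."]

# Encode one (prompt, choice) pair per candidate, then add a batch dimension.
enc = tokenizer([prompt] * len(choices), choices,
                return_tensors="pt", padding=True)
inputs = {k: v.unsqueeze(0) for k, v in enc.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, num_choices)
print("Preferred choice:", choices[logits.argmax(dim=1).item()])
```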
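
And a token-classification (named entity recognition) sketch, using a community BERT checkpoint fine-tuned for NER; the checkpoint name is an assumption, and the course may train its own token classifier instead.

```python
# Token classification via the pipeline API; `dslim/bert-base-NER` is an
# assumed community checkpoint, not course material.
from transformers import pipeline

ner = pipeline("token-classification", model="dslim/bert-base-NER",
               aggregation_strategy="simple")

for entity in ner("Google released BERT in 2018 from Mountain View."):
    # Each entity carries the tag (e.g. ORG, LOC), the matched text span,
    # and a confidence score.
    print(entity["entity_group"], entity["word"], f"{entity['score']:.3f}")
```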

Overview/Description

Bidirectional Encoder Representations from Transformers (BERT) can be implemented in various ways, and it is up to AI practitioners to decide which implementation best suits a particular product. It is also essential to recognize all of BERT's capabilities and its full potential in NLP.

In this course, you'll outline the theoretical approach to each of several BERT use cases before illustrating how to implement it. Specifically, you'll learn how to use BERT for search engine optimization, next sentence prediction, sequence classification, multiple-choice reading comprehension, token classification, and question answering, implementing a simple example for each use case discussed. Lastly, you'll examine some fundamental guidelines for using BERT for content optimization.
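
As a taste of the question-answering use case, here is a hedged sketch using the Hugging Face question-answering pipeline with a BERT model fine-tuned on SQuAD (the checkpoint name is an assumption, not something the course prescribes).

```python
# Question answering: extract an answer span from a context passage.
# The SQuAD-fine-tuned checkpoint below is an assumed dependency.
from transformers import pipeline

qa = pipeline("question-answering",
              model="bert-large-uncased-whole-word-masking-finetuned-squad")

context = ("Bidirectional Encoder Representations from Transformers, or BERT, "
           "was released by Google in 2018.")
result = qa(question="Who released BERT?", context=context)
print(result["answer"], f"{result['score']:.3f}")
```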



Target

Prerequisites: none
